Level methods uniformly optimal for composite and structured nonsmooth convex optimization
Abstract
The main goal of this paper is to develop uniformly optimal first-order methods for large-scale convex programming (CP). By uniform optimality we mean that the first-order methods themselves do not require the input of any problem parameters, but can still achieve the best possible iteration complexity bounds. To this end, we provide a substantial generalization of the accelerated level method by Lan [15] and demonstrate that it can uniformly achieve the optimal iteration complexity for solving a class of generalized composite CP problems, which covers a wide range of CP problems, including nonsmooth, weakly smooth, smooth, min-max, composite, and regularized problems. Then, we present two variants of this level method for solving a class of structured CP problems with a bilinear saddle point structure due to Nesterov [36]. We show that one of these variants can achieve the O(1/ε) iteration complexity without requiring the input of any problem parameters. We illustrate the significant advantages of these level methods over some existing first-order methods for solving certain important classes of semidefinite programming (SDP) and two-stage stochastic programming (SP) problems.
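To make the mechanism concrete, the sketch below shows the classical bundle-level method that these accelerated level methods generalize; it is not the paper's algorithm. The setup is illustrative: a first-order oracle f_and_subgrad(x) returning a function value and a subgradient, and a simple box feasible set. Each iteration builds a cutting-plane model, computes a lower bound by minimizing the model (an LP), and projects onto a level set of the model (a small QP).

```python
# Minimal sketch of the classical bundle-level method, assuming a subgradient
# oracle and a box feasible set X = {lb <= x <= ub} (illustrative choices).
import numpy as np
from scipy.optimize import linprog, minimize

def bundle_level(f_and_subgrad, x0, lb, ub, lam=0.5, iters=50, tol=1e-6):
    n = len(x0)
    box = list(zip(lb, ub))
    cuts = []                                  # cut i: f(x_i) + <g_i, y - x_i>
    x = np.asarray(x0, dtype=float)
    f_up = np.inf
    for _ in range(iters):
        fx, g = f_and_subgrad(x)
        f_up = min(f_up, fx)                   # upper bound: best value seen
        cuts.append((g.copy(), fx - g @ x))    # model(y) = max_i g_i @ y + b_i
        # Lower bound: minimize the cutting-plane model over X, an LP in (y, t).
        A = np.array([np.append(gi, -1.0) for gi, _ in cuts])
        rhs = np.array([-bi for _, bi in cuts])
        lp = linprog(np.append(np.zeros(n), 1.0), A_ub=A, b_ub=rhs,
                     bounds=box + [(None, None)])
        f_low = lp.fun
        if f_up - f_low <= tol:
            break
        # Level step: project x onto {y in X : model(y) <= level}, a small QP.
        level = lam * f_low + (1.0 - lam) * f_up
        x_c = x.copy()
        cons = [{"type": "ineq",
                 "fun": lambda y, gi=gi, bi=bi, lev=level: lev - (gi @ y + bi)}
                for gi, bi in cuts]
        x = minimize(lambda y: 0.5 * np.sum((y - x_c) ** 2), x_c,
                     bounds=box, constraints=cons).x
    return x, f_up
```

The level parameter lam balances the lower and upper bounds when setting the target level, and the gap f_up - f_low serves as a computable accuracy certificate, which is the feature the level methods in the paper exploit to avoid problem-parameter input.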
Similar papers
Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
The main goal of this paper is to develop uniformly optimal first-order methods for convex programming (CP). By uniform optimality we mean that the first-order methods themselves do not require the input of any problem parameters, but can still achieve the best possible iteration complexity bounds. By incorporating a multi-step acceleration scheme into the well-known bundle-level method, we dev...
Optimal subgradient algorithms with application to large-scale linear inverse problems
This study addresses some algorithms for solving structured unconstrained convex optimization problems using first-order information where the underlying function includes high-dimensional data. The primary aim is to develop an implementable algorithmic framework for solving problems with multiterm composite objective functions involving linear mappings using the optimal subgradient algorithm, ...
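As a concrete instance of this problem class (not the optimal subgradient algorithm of the cited work), the following is a standard proximal-gradient (ISTA) sketch for the model problem min_x 0.5||Ax - b||² + lam·||x||₁, a composite objective involving a linear mapping:

```python
# Standard ISTA sketch for a sparse linear inverse problem; illustrative only.
import numpy as np

def ista(A, b, lam, steps=200):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)           # gradient of 0.5 * ||Ax - b||^2
        z = x - grad / L                   # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x
```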
An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems
Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...
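For intuition only, the sketch below shows a generic projection-type neurodynamic system, not the paper's one-layer model: the dynamics dx/dt = P_X(x - g(x)) - x, with g(x) a subgradient of the objective and X a box (an illustrative choice), integrated by forward Euler. With a small step h the trajectory settles into a small neighborhood of the minimizer.

```python
import numpy as np

def neurodynamic_min(subgrad, lb, ub, x0, h=0.05, steps=2000):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        target = np.clip(x - subgrad(x), lb, ub)  # projected (sub)gradient point
        x += h * (target - x)                     # Euler step of the dynamics
    return x

# Example: minimize the nonsmooth f(x) = ||x - c||_1 over the box [0, 1]^3.
c = np.array([0.3, 1.5, -0.2])
x_star = neurodynamic_min(lambda x: np.sign(x - c),
                          np.zeros(3), np.ones(3), np.full(3, 0.5))
```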
Accelerated gradient sliding for structured convex optimization
Our main goal in this paper is to show that one can skip gradient computations for gradient descent type methods applied to certain structured convex programming (CP) problems. To this end, we first present an accelerated gradient sliding (AGS) method for minimizing the summation of two smooth convex functions with different Lipschitz constants. We show that the AGS method can skip the gradient...
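AGS builds on Nesterov-type acceleration; the sketch below shows only the basic accelerated gradient step for a single smooth function with gradient oracle grad and Lipschitz constant L (assumed inputs), omitting AGS's inner loop that skips gradient evaluations of the smoother component:

```python
# Basic Nesterov/FISTA-style accelerated gradient step; illustrative building
# block, not the AGS method itself.
import numpy as np

def accelerated_gradient(grad, L, x0, iters=100):
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L            # gradient step from extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x
```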
Bundle-type methods uniformly optimal for smooth and nonsmooth convex optimization
The bundle-level method and certain of its variants are known to exhibit an optimal rate of convergence, i.e., O(1/√t), and also excellent practical performance for solving general nonsmooth convex programming (CP) problems. However, this rate of convergence is significantly worse than the optimal one for solving smooth CP problems, i.e., O(1/t²). In this paper, we present new bundle-type method...
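The O(1/√t) nonsmooth rate quoted above is also attained by the classical projected subgradient method with diminishing steps; a minimal sketch (illustrative, not the bundle-type methods of the cited paper), where project is any Euclidean projection onto the feasible set, e.g. lambda z: np.clip(z, 0, 1) for a box:

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, iters=1000):
    x = np.asarray(x0, dtype=float)
    avg = x.copy()
    for t in range(1, iters + 1):
        x = project(x - subgrad(x) / np.sqrt(t))  # diminishing step ~ 1/sqrt(t)
        avg += (x - avg) / t                      # ergodic (running) average
    return avg
```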